On Efficient Decoding and Design of Sparse Random Linear Network Codes

Authors

  • Ye Li
  • Wai-Yip Chan
  • Steven D. Blostein
Abstract

Random linear network coding (RLNC) in theory achieves the max-flow capacity of multicast networks, at the cost of high decoding complexity. To improve the performance-complexity tradeoff, we consider the design of sparse network codes. A generation-based strategy is employed in which source packets are grouped into overlapping subsets called generations. RLNC is performed only amongst packets belonging to the same generation throughout the network so that sparseness can be maintained. In this paper, generation-based network codes with low reception overheads and decoding costs are designed for transmitting on the order of 10^2 to 10^3 source packets. A low-complexity overhead-optimized decoder is proposed that exploits “overlaps” between generations. The sparseness of the codes is exploited through local processing and multiple rounds of pivoting of the decoding matrix. To demonstrate the efficacy of our approach, codes comprising a binary precode, random overlapping generations, and binary RLNC are designed. The results show that our designs can achieve negligible code overheads at low decoding costs, and outperform existing network codes that use the generation-based strategy.
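As a concrete illustration of the generation-based strategy, the following minimal Python sketch groups source packets into overlapping generations and produces binary RLNC symbols by XOR-ing a random nonzero subset of one generation's packets. The generation size, overlap width, and packet length are illustrative assumptions, not parameters taken from the paper.

```python
# Minimal sketch of generation-based binary RLNC encoding; all
# parameters below are illustrative assumptions, not the paper's design.
import random

PACKET_LEN = 64   # bytes per source packet (assumed)
GEN_SIZE = 8      # packets per generation (assumed)
OVERLAP = 2       # packets shared by consecutive generations (assumed)

def make_generations(num_packets, gen_size=GEN_SIZE, overlap=OVERLAP):
    """Group packet indices into overlapping subsets ("generations")."""
    gens, start, step = [], 0, gen_size - overlap
    while start < num_packets:
        gens.append(list(range(start, min(start + gen_size, num_packets))))
        start += step
    return gens

def encode_symbol(packets, generation):
    """Binary RLNC within one generation: XOR a random nonzero subset."""
    coeffs = [random.randint(0, 1) for _ in generation]
    if not any(coeffs):
        coeffs[random.randrange(len(coeffs))] = 1  # avoid the all-zero vector
    coded = bytes(PACKET_LEN)
    for c, idx in zip(coeffs, generation):
        if c:
            coded = bytes(a ^ b for a, b in zip(coded, packets[idx]))
    return generation, coeffs, coded

packets = [random.randbytes(PACKET_LEN) for _ in range(32)]
gens = make_generations(len(packets))
gen = random.choice(gens)  # a node recodes only within a single generation
ids, coeffs, coded = encode_symbol(packets, gen)
print(ids, coeffs, coded[:4].hex())
```

Because every coded symbol mixes packets from a single generation only, the decoding matrix stays sparse and block-structured; the overlaps between generations are what allow a decoder to propagate decoded packets from one generation into its neighbors.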

Similar Articles

Linear-Complexity Overhead-Optimized Random Linear Network Codes

Sparse random linear network coding (SRLNC) is an attractive technique proposed in the literature to reduce the decoding complexity of random linear network coding. Recognizing the fact that the existing SRLNC schemes are not efficient in terms of the required reception overhead, we consider the problem of designing overhead-optimized SRLNC schemes. To this end, we introduce a new design of SRL...

Decoding Algorithms for Random Linear Network Codes

We consider the problem of efficient decoding of a random linear code over a finite field. In particular, we are interested in the case where the code is random and relatively sparse, and we use the binary finite field as an example. The goal is to decode the data using fewer operations, to potentially achieve a high coding throughput and reduce energy consumption. We use an on-the-fly version of the ...
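The snippet is truncated, but an "on-the-fly" decoder for binary random linear codes is commonly realized as incremental Gaussian elimination: each arriving symbol is reduced against the pivot rows found so far and kept only if it is innovative. The sketch below assumes that interpretation and represents coding vectors and payloads as Python ints used as GF(2) bitmasks; it is illustrative, not the paper's implementation.

```python
class OnTheFlyGF2Decoder:
    """Incremental Gaussian elimination over GF(2) (an assumed reading of
    "on-the-fly"); coding vectors and payloads are ints used as bitmasks."""

    def __init__(self, n):
        self.n = n       # number of source packets
        self.rows = {}   # pivot bit -> (coding vector, payload); top bit = pivot

    def receive(self, vec, payload):
        """Reduce one coded symbol as it arrives; keep it only if innovative."""
        for p in sorted(self.rows, reverse=True):
            if vec >> p & 1:
                rvec, rpay = self.rows[p]
                vec, payload = vec ^ rvec, payload ^ rpay
        if vec == 0:
            return False  # linearly dependent: discard immediately
        self.rows[vec.bit_length() - 1] = (vec, payload)
        return True

    def decoded(self):
        """Back-substitute once rank n is reached; returns source payloads."""
        if len(self.rows) < self.n:
            return None
        out = [0] * self.n
        for p in range(self.n):          # ascending pivots
            vec, pay = self.rows[p]
            for q in range(p):
                if vec >> q & 1:
                    pay ^= out[q]        # lower pivots are already solved
            out[p] = pay
        return out

dec = OnTheFlyGF2Decoder(3)
dec.receive(0b101, 0x0A)  # source0 XOR source2
dec.receive(0b011, 0x0C)  # source0 XOR source1
dec.receive(0b100, 0x05)  # source2 alone
print(dec.decoded())      # [0x0F, 0x03, 0x05] -> sources 0, 1, 2
```

Discarding dependent symbols at arrival time is what keeps the work spread across the reception process instead of concentrated in one final elimination pass.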

Tunable Sparse Network Coding

A fundamental understanding of the relationship between delay performance and complexity in network coding is instrumental towards its application in practical systems. The main argument against delay-optimal random linear network coding (RLNC) is its decoding complexity, which is O(n^3) for n original packets. Fountain codes, such as LT and Raptor codes, reduce the decoding load on the receiver ...

Advanced Algorithms, December 23, 2004. Lecture 20: LT Codes. Lecturer: Amin

Random linear fountain codes were introduced in the last lecture as sparse-graph codes for channels with erasures. It turned out that their encoding and decoding costs were quadratic and cubic, respectively, in the number of packets encoded. In this lecture we study Luby Transform (LT) codes, pioneered by Michael Luby, which retain the good performance of random linear fountain codes while drastically reduci...
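For concreteness, an LT encoder draws a degree d from a soliton-type distribution, picks d distinct source packets uniformly at random, and XORs them together. The sketch below uses the ideal soliton distribution for brevity; practical LT codes use the robust soliton, and every parameter here is an illustrative assumption.

```python
import random

def ideal_soliton(k):
    """rho(1) = 1/k, rho(d) = 1/(d*(d-1)) for d = 2..k."""
    return [1.0 / k] + [1.0 / (d * (d - 1)) for d in range(2, k + 1)]

def lt_encode_symbol(packets):
    """One LT-coded symbol: sample a degree, XOR that many random packets."""
    k = len(packets)
    d = random.choices(range(1, k + 1), weights=ideal_soliton(k))[0]
    neighbors = random.sample(range(k), d)
    coded = 0
    for i in neighbors:
        coded ^= packets[i]
    return neighbors, coded

packets = [random.getrandbits(32) for _ in range(16)]
print(lt_encode_symbol(packets))
```

The low average degree of the sampled distribution is what brings the encoding and decoding costs down from the quadratic and cubic costs of the dense random linear fountain.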

Perpetual Codes for Network Coding

Random Linear Network Coding (RLNC) provides a theoretically efficient method for coding. Some of its practical drawbacks are the complexity of decoding and the overhead due to the coding vectors. For computationally weak and battery-driven platforms, these challenges are particularly important. In this work, we consider the coding variant Perpetual codes, which are sparse, non-uniform, and the cod...
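The description is cut off, but the characteristic structure of a perpetual coding vector is a pivot position followed by a fixed-width band of random coefficients that wraps around the end of the generation, which is what makes the codes sparse and non-uniform. The sketch below generates such a vector over GF(2); the width parameter and the binary field are assumptions made for illustration.

```python
import random

def perpetual_vector(n, width):
    """Sparse, non-uniform coding vector: pivot 1 plus a wrapping random band
    (assumed perpetual-code structure, illustrative only)."""
    pivot = random.randrange(n)
    vec = [0] * n
    vec[pivot] = 1                                   # pivot coefficient is always 1
    for j in range(1, width + 1):
        vec[(pivot + j) % n] = random.randint(0, 1)  # band of width `width`
    return vec

print(perpetual_vector(n=16, width=4))
```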


Journal:
  • CoRR

Volume: abs/1604.05573, Issue: -

Pages: -

Publication year: 2016